Laryngeal Pressure Estimation With a Recurrent Neural Network
Authors
Abstract
Similar resources
Translation Quality Estimation using Recurrent Neural Network
This paper describes our submission to the shared task on word/phrase level Quality Estimation (QE) in the First Conference on Statistical Machine Translation (WMT16). The objective of the shared task was to predict if the given word/phrase is a correct/incorrect (OK/BAD) translation in the given sentence. In this paper, we propose a novel approach for word level Quality Estimation using Recurr...
Recurrent Neural Network based Translation Quality Estimation
This paper describes the recurrent neural network based model for translation quality estimation. Recurrent neural network based quality estimation model consists of two parts. The first part using two bidirectional recurrent neural networks generates the quality information about whether each word in translation is properly translated. The second part using another recurrent neural network pre...
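The word-level quality-estimation setup these two abstracts describe can be illustrated with a minimal sketch: a bidirectional vanilla RNN that assigns each target word a probability of being an OK translation. This is not the submitted systems' architecture; the dimensions, random untrained weights, and token ids below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_emb, d_hid, seq_len = 8, 16, 5

E = rng.normal(size=(100, d_emb))                   # toy embedding table
Wf = rng.normal(size=(d_hid, d_emb + d_hid)) * 0.1  # forward RNN weights
Wb = rng.normal(size=(d_hid, d_emb + d_hid)) * 0.1  # backward RNN weights
Wo = rng.normal(size=(2, 2 * d_hid)) * 0.1          # OK/BAD classifier

def run_rnn(xs, W):
    # One pass of a vanilla tanh RNN; returns the hidden state at each step.
    h = np.zeros(d_hid)
    states = []
    for x in xs:
        h = np.tanh(W @ np.concatenate([x, h]))
        states.append(h)
    return states

tokens = rng.integers(0, 100, size=seq_len)  # stand-in for a translated sentence
emb = [E[t] for t in tokens]
fwd = run_rnn(emb, Wf)            # left-to-right states
bwd = run_rnn(emb[::-1], Wb)[::-1]  # right-to-left states, realigned

probs_ok = []
for f, b in zip(fwd, bwd):
    logits = Wo @ np.concatenate([f, b])  # both directions inform each label
    p = np.exp(logits - logits.max())
    p /= p.sum()                          # softmax over {OK, BAD}
    probs_ok.append(float(p[0]))
print([round(p, 3) for p in probs_ok])
```

With trained weights, each `probs_ok[t]` would be thresholded into the OK/BAD tag the shared task asks for.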
A Recurrent Neural Network Model for Solving Linear Semidefinite Programming
In this paper we solve a wide range of Semidefinite Programming (SDP) problems by using Recurrent Neural Networks (RNNs). SDP is an important numerical tool for analysis and synthesis in systems and control theory. First we reformulate the problem as a linear programming problem; second, we reformulate it as a first-order system of ordinary differential equations. Then a recurrent neural network...
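The general recipe in this abstract (recast the optimization problem as a first-order ODE, then let a recurrent update integrate it toward the solution) can be sketched on a much simpler problem. The sketch below applies Euler-discretized gradient flow to a least-squares problem rather than an actual SDP; the matrix, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 3))
b = rng.normal(size=6)

# Gradient flow dx/dt = -A^T (A x - b); each Euler step is one
# "recurrent" update of the network state x.
x = np.zeros(3)
dt = 0.05
for _ in range(5000):
    x = x + dt * (-A.T @ (A @ x - b))

# The recurrent iteration converges to the least-squares solution.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_star, atol=1e-5))
```

The SDP case follows the same pattern, only with a more elaborate right-hand side derived from the reformulated linear program.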
CCG Supertagging with a Recurrent Neural Network
Recent work on supertagging using a feedforward neural network achieved significant improvements for CCG supertagging and parsing (Lewis and Steedman, 2014). However, their architecture is limited to considering local contexts and does not naturally model sequences of arbitrary length. In this paper, we show how directly capturing sequence information using a recurrent neural network leads to f...
N-gram Language Modeling using Recurrent Neural Network Estimation
We investigate the effective memory depth of RNN models by using them for n-gram language model (LM) smoothing. Experiments on a small corpus (UPenn Treebank, one million words of training data and 10k vocabulary) have found the LSTM cell with dropout to be the best model for encoding the n-gram state when compared with feed-forward and vanilla RNN models. When preserving the sentence independe...
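The idea of using a recurrent cell to encode the n-gram state can be sketched as follows: at each position the network re-reads only the last n-1 tokens, so its hidden state carries exactly the context an n-gram LM conditions on. The toy vanilla RNN below uses random untrained weights (the paper's experiments used trained LSTM cells with dropout); the vocabulary size, dimensions, and sentence are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
vocab, d = 10, 12
n = 3                                   # trigram LM: condition on 2 words
E = rng.normal(size=(vocab, d))         # toy embedding table
W = rng.normal(size=(d, 2 * d)) * 0.1   # recurrent cell weights
Wo = rng.normal(size=(vocab, d)) * 0.1  # output projection

def ngram_dist(context):
    # Re-run the cell over only the last n-1 tokens, so the state
    # encodes the n-gram context and nothing older.
    h = np.zeros(d)
    for tok in context[-(n - 1):]:
        h = np.tanh(W @ np.concatenate([E[tok], h]))
    logits = Wo @ h
    p = np.exp(logits - logits.max())
    return p / p.sum()                  # softmax over the vocabulary

sentence = [3, 1, 4, 1, 5]              # stand-in token ids
logprob = 0.0
for i, w in enumerate(sentence):
    logprob += np.log(ngram_dist(sentence[:i])[w])
print(f"trigram-RNN log-probability: {logprob:.3f}")
```

Truncating the context this way is what makes the comparison with count-based n-gram smoothing meaningful: both models see the same finite history.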
Journal
Journal title: IEEE Journal of Translational Engineering in Health and Medicine
Year: 2019
ISSN: 2168-2372
DOI: 10.1109/jtehm.2018.2886021